Markov measure

Markov measure
марковская мера (Russian equivalent)

English-Russian Dictionary on Probability, Statistics, and Combinatorics. Philadelphia and Moscow: Society for Industrial and Applied Mathematics and TVP Science Publishers, 1994.
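
The dictionary itself gives only the term pair above. For orientation, a common definition from ergodic theory and symbolic dynamics (added here for context, not part of the dictionary entry): given a finite state set, a probability vector \pi and a row-stochastic matrix P = (P_{ij}), the associated Markov measure \mu on the space of sequences assigns to each cylinder set

    \mu\big([x_0 x_1 \dots x_n]\big) = \pi(x_0)\, P_{x_0 x_1} P_{x_1 x_2} \cdots P_{x_{n-1} x_n}.

If \pi P = \pi, the measure is shift-invariant, which is the case usually meant in ergodic theory.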


See what "Markov measure" means in other dictionaries:

  • Markov's inequality — gives an upper bound for the measure of the set where f(x) exceeds a given level. The bound combines the level with the average value of f (the inequality is stated in standard form after this list) …   Wikipedia

  • Markov chain geostatistics — refers to the Markov chain models, simulation algorithms, and associated spatial correlation measures (e.g., the transiogram) based on Markov chain random field theory, which extends a single Markov chain into a multi-dimensional field for…   Wikipedia

  • Markov chain — A simple two state Markov chain. A Markov chain, named for Andrey Markov, is a mathematical system that undergoes transitions from one state to another, between a finite or countable number of possible states. It is a random process characterized …   Wikipedia

  • Markov network — A Markov network, or Markov random field, is a model of the (full) joint probability distribution of a set 𝒳 of random variables having the Markov property. A Markov network is similar to a Bayesian network in its representation of…   Wikipedia

  • Markov random field — A Markov random field, Markov network or undirected graphical model is a set of variables having a Markov property described by an undirected graph. A Markov random field is similar to a Bayesian network in its representation of dependencies. It… …   Wikipedia

  • Markov logic network — A Markov logic network (or MLN) is a probabilistic logic which applies the ideas of a Markov network to first order logic, enabling uncertain inference. Markov logic networks generalize first order logic, in the sense that, in a certain limit,… …   Wikipedia

  • Markov kernel — In probability theory, a Markov kernel is a map that plays the role in the general theory of Markov processes that the transition matrix plays in the theory of Markov processes with a finite state space. Formal definition: Let … be measurable…   Wikipedia

  • Measure-preserving dynamical system — In mathematics, a measure-preserving dynamical system is an object of study in the abstract formulation of dynamical systems, and ergodic theory in particular. …   Wikipedia

  • Markov additive process — In applied probability, a Markov additive process (MAP) {(X(t),J(t)) : t ≥ 0} is a bivariate Markov process whose transition probability measure is translation invariant in the additive component X(t). That is to say, the… …   Wikipedia

  • Gibbs measure — In mathematics, the Gibbs measure, named after Josiah Willard Gibbs, is a probability measure frequently seen in many problems of probability theory and statistical mechanics. It is the measure associated with the Boltzmann distribution, and… …   Wikipedia

  • Quantum Markov chain — In mathematics, the quantum Markov chain is a reformulation of the ideas of a classical Markov chain, replacing the classical definitions of probability with quantum probability. Very roughly, the theory of a quantum Markov chain resembles that… …   Wikipedia
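
For reference, the inequality named in the first item above, in its standard form (stated here for convenience, not taken from the excerpt): for a non-negative random variable X and any a > 0,

    P(X \ge a) \le \frac{\mathbb{E}[X]}{a},

or, in the measure-theoretic form matching the excerpt's wording, \mu\{x : f(x) \ge t\} \le \frac{1}{t} \int f \, d\mu for a non-negative measurable function f and any t > 0.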

